EHR phenotyping via jointly embedding medical concepts and words into a unified vector space
Authors
Abstract
Similar Resources
Generic object recognition using graph embedding into a vector space
This paper describes a method for generic object recognition using graph structural expression. In recent years, generic object recognition by computer is finding extensive use in a variety of fields, including robotic vision and image retrieval. Conventional methods use a bag-of-features (BoF) approach, which expresses the image as an appearance frequency histogram of visual words by quantizin...
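The bag-of-features (BoF) representation mentioned in this abstract can be illustrated with a minimal sketch: local descriptors are quantized into "visual words" with k-means, and each image is summarized as a frequency histogram over those words. The use of scikit-learn and random stand-in descriptors is an assumption for illustration, not the paper's actual pipeline.

```python
# Minimal bag-of-features sketch: quantize local descriptors into visual words,
# then describe each image by a normalized word-frequency histogram.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for local feature descriptors (e.g. SIFT), one array per image.
descriptors_per_image = [rng.normal(size=(200, 128)) for _ in range(10)]

# 1. Build the visual vocabulary by clustering all descriptors.
all_descriptors = np.vstack(descriptors_per_image)
n_words = 50
codebook = KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(all_descriptors)

# 2. Represent each image as a histogram of visual-word frequencies.
def bof_histogram(descriptors: np.ndarray) -> np.ndarray:
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=n_words).astype(float)
    return hist / hist.sum()

histograms = np.stack([bof_histogram(d) for d in descriptors_per_image])
print(histograms.shape)  # (10, 50): one 50-bin histogram per image
```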
Vector Embedding of Wikipedia Concepts and Entities
Using deep learning for different machine learning tasks, such as image classification and word embedding, has recently gained much attention. Its strong reported performance on specific Natural Language Processing (NLP) tasks compared with other approaches is the reason for its popularity. Word embedding is the task of mapping words or phrases to a low-dimensional numerical vector. ...
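A short sketch of the word-embedding task described above: train vectors on a toy corpus and query the resulting space. The tiny corpus and the choice of gensim's Word2Vec are assumptions for illustration, not the data or method of the cited paper.

```python
# Minimal word-embedding sketch: map words to low-dimensional vectors and
# query nearest neighbours in the learned space.
from gensim.models import Word2Vec

corpus = [
    ["patient", "was", "diagnosed", "with", "diabetes"],
    ["diabetes", "is", "treated", "with", "metformin"],
    ["the", "patient", "received", "metformin"],
]

# Train skip-gram embeddings; each word is mapped to a 50-dimensional vector.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1, seed=0)

vec = model.wv["diabetes"]                        # embedding of "diabetes"
print(vec.shape)                                  # (50,)
print(model.wv.most_similar("diabetes", topn=2))  # nearest words in the space
```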
EHR Big Data Deep Phenotyping
Objectives: Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods: As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increase...
Jointly Sparse Vector Recovery via Reweighted ℓ1
An iterative reweighted algorithm is proposed for the recovery of jointly sparse vectors from multiple-measurement vectors (MMV). The proposed MMV algorithm is an extension of the iterative reweighted ℓ1 algorithm for single-measurement problems. The proposed algorithm (M-IRL1) is demonstrated to outperform non-reweighted MMV algorithms under noiseless measurements. A regularization of the M-IRL...
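The iterative reweighting idea behind methods of this kind can be sketched as follows: repeatedly solve a weighted convex recovery problem, updating the weights from the previous estimate so that small rows are penalized more strongly. The row-norm (ℓ2,1) formulation and the cvxpy solver below are assumptions chosen for illustration, not the exact M-IRL1 algorithm of the cited paper.

```python
# Sketch of iterative reweighting for jointly sparse (MMV) recovery.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n, L = 20, 40, 3          # measurements, signal length, number of vectors

# Jointly sparse ground truth: the same 4 rows are active in every column.
X_true = np.zeros((n, L))
support = rng.choice(n, size=4, replace=False)
X_true[support, :] = rng.normal(size=(4, L))

A = rng.normal(size=(m, n))
Y = A @ X_true               # noiseless multiple-measurement vectors

w = np.ones(n)               # initial weights (first pass = unweighted)
eps = 1e-3
for _ in range(4):           # a few reweighting iterations
    X = cp.Variable((n, L))
    row_norms = cp.norm(X, 2, axis=1)
    objective = cp.Minimize(cp.sum(cp.multiply(w, row_norms)))
    cp.Problem(objective, [A @ X == Y]).solve()
    X_hat = X.value
    # Rows with small norm get larger weights in the next iteration.
    w = 1.0 / (np.linalg.norm(X_hat, axis=1) + eps)

print("recovered support:", np.sort(np.where(np.linalg.norm(X_hat, axis=1) > 1e-4)[0]))
print("true support:     ", np.sort(support))
```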
More On Embedding an Affine Space in a Vector Space
for all $a_1, \ldots, a_m \in E$, all $v_1, \ldots, v_m \in \overrightarrow{E}$, and all $\lambda_1, \ldots, \lambda_m \in \mathbb{R}$. Furthermore, for $\lambda_i \neq 0$, $1 \leq i \leq m$, we have $\hat{f}(v_1 \mathbin{\hat{+}} \lambda_1 a_1, \ldots, v_m \mathbin{\hat{+}} \lambda_m a_m) = \lambda_1 \cdots \lambda_m \, f(a_1 + \lambda_1^{-1} v_1, \ldots, a_m + \lambda_m^{-1} v_m)$. Proof. Let us assume that $\hat{f}$ exists. We first prove by induction on $k$, $1 \leq k \leq m$, that $\hat{f}(a_1, \ldots, v_{i_1}, \ldots, v_{i_k}, \ldots, a_m) = f_S(v_{i_1}, \ldots, v_{i_k})$, for every $S \subseteq \{1, \ldots, m\}$, ...
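As an added sanity check (an illustration, not part of the cited abstract), instantiating the formula above at $m = 1$ gives:

```latex
% The m = 1 case of the formula above, for \lambda \neq 0:
\[
  \hat{f}(v \mathbin{\hat{+}} \lambda a) \;=\; \lambda \, f\!\left(a + \lambda^{-1} v\right),
\]
% and taking \lambda = 1 recovers \hat{f}(v \mathbin{\hat{+}} a) = f(a + v).
```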
Journal
Journal title: BMC Medical Informatics and Decision Making
Year: 2018
ISSN: 1472-6947
DOI: 10.1186/s12911-018-0672-0